
    Optimal speed limit for shared-use roadways

    Motor vehicle crashes are a serious social problem in the United States. Each year a large number of crashes occur and many people are killed or injured, resulting in substantial economic costs. Because posted speed limit is strongly related to both crash frequency and crash injury severity, determining optimal speed limits on roadways is necessary to minimize these costs. A comprehensive literature review of the relationship among posted speed limit, crash frequency, and crash injury severity was conducted. Crash frequency prediction models and crash injury severity models were developed to estimate crash frequency and the injury severity of victims at different posted speed limits, and model tests were performed to verify goodness of fit. Crash costs were then calculated from crash frequency, injury severity level, and the unit cost of each severity level. In addition, CORSIM simulation was run under various posted speed limits to obtain parameters related to operational cost. Total cost curves were then built to show the relationship between posted speed limit and total economic cost. Using the developed crash frequency models, injury severity models, and CORSIM simulation results, case studies were conducted to determine optimal speed limits on selected roadways on the basis of total cost.
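
    The abstract does not give the cost functions themselves; purely to illustrate the total-cost-minimization step, the sketch below uses hypothetical crash-cost and operational-cost curves (all functional forms and coefficients are invented) and picks the posted limit that minimizes their sum:

        import numpy as np

        # Hypothetical cost components; the paper's actual inputs are
        # statistical crash-frequency/severity models plus CORSIM output.
        speed_limits = np.arange(40, 85, 5)   # candidate posted limits, mph

        def crash_cost(v):
            # Assumed: crash cost rises with the limit (severity effect).
            return 1.0e5 + 900.0 * (v - 40) ** 1.5

        def operational_cost(v):
            # Assumed: travel-time cost falls as the limit rises.
            return 4.0e5 * 40.0 / v

        total_cost = crash_cost(speed_limits) + operational_cost(speed_limits)
        optimal = speed_limits[np.argmin(total_cost)]
        print(f"optimal posted speed limit: {optimal} mph")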

    The Geometric Construction of WZW Effective Action in Non-commutative Manifold

    By constructing a closed one-cochain density $\Omega^1_{2n}$ in the gauge group space, we obtain the WZW effective Lagrangian on a higher-dimensional non-commutative space. In particular, the consistent anomalies derived from this WZW effective action in non-commutative four-dimensional space coincide with those obtained by L. Bonora et al.
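
    For context, in the commutative case the one-cochain $\Omega^1_{2n}$ and the consistent anomaly arise from the standard Stora-Zumino descent equations; a sketch in standard notation (this is the commutative prototype, not the paper's non-commutative generalization):

        \begin{align}
          P(F^{n+1}) &= d\,\Omega_{2n+1}(A,F), \\
          \delta_v\,\Omega_{2n+1}(A,F) &= d\,\Omega^1_{2n}(v,A,F), \\
          G(v) &= \int_{M_{2n}} \Omega^1_{2n}(v,A,F),
        \end{align}

    where $P$ is a closed invariant polynomial in the curvature $F$, $\Omega_{2n+1}$ is the Chern-Simons form, $\delta_v$ is the gauge variation with parameter $v$, and $G(v)$ is the consistent anomaly in $2n$ dimensions.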

    Using (microchannel) electrochemical impedance spectroscopy to detect bacterial activities

    Although rapid molecular detection methods are available, (automated) liquid culture remains the "gold standard" for detecting living bacteria, especially when similar dead bacteria may also be present in the sample. However, current liquid culture methods rely on the effects of bacterial metabolism (changes in O2/CO2 levels, pH, etc.) to bring about measurable changes to the suspension; hence, for slow-growing microorganisms like M. tuberculosis (Mtb), they can take up to 6 weeks to yield results. To cut the time to positivity, we applied the concept of "detection by death," using our high-sensitivity microchannel electrical impedance spectroscopy (m-EIS) method to rapidly detect bacteria in various samples. Viable bacterial cells can be killed in a few hours with very high doses of cidal drugs. m-EIS relies on the fact that high-frequency AC fields cause transient charge accumulation at the intact membranes of live cells. After cell death in the microchannel, charge is no longer accumulated and the bulk capacitance of the microchannel decreases; by detecting this change, we can rapidly detect viable bacterial cells in a sample. For viable TB detection, we used synthetic sputum spiked with gram-positive bacteria, gram-negative bacteria, and M. tuberculosis H37Ra. After decontamination (killing all non-TB bacteria), we were able to discern decreases in bulk capacitance for suspensions containing ~500 CFU/ml of Mtb cells (but not for ~50 CFU/ml). Thus, our turnaround time (~4 hours) and limit of detection (<500 CFU/ml) are comparable to those of GeneXpert™. Using m-EIS, we can also detect lactic acid bacteria (LAB) contamination in ethanol fermentation at concentrations as low as 5000 CFU/ml in under 3 hours; this limit of detection is three orders of magnitude lower than that of the HPLC method, at a similar detection time. Aside from investigating bacterial activity in solution, we also observed biofilm formation on the electrodes, where the interfacial capacitance increases in proportion to the number of cells in the biofilm matrix. In the future, we believe our method could also record cell death within biofilms, allowing researchers to rapidly screen new anti-biofilm materials and drugs.
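
    A minimal sketch of the "detection by death" readout, assuming a simple parallel R-C bulk model (the thesis's actual m-EIS equivalent circuit also includes electrode interfacial elements, and all thresholds below are invented): estimate the bulk capacitance from each impedance spectrum over time and flag a sustained post-treatment decrease.

        import numpy as np

        def bulk_capacitance(freqs, z_measured):
            """Estimate C from complex impedance samples of a parallel R-C.

            For Y = 1/Z = 1/R + j*omega*C, the slope of Im(Y) vs omega is C.
            """
            omega = 2 * np.pi * np.asarray(freqs)
            y = 1.0 / np.asarray(z_measured)
            return np.polyfit(omega, y.imag, 1)[0]

        def death_detected(c_series, drop_fraction=0.02):
            """Flag a sustained decrease in bulk capacitance after drug dosing."""
            c = np.asarray(c_series)
            baseline = c[:5].mean()
            return c[-5:].mean() < baseline * (1 - drop_fraction)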

    Optimizing Urban Distribution Routes for Perishable Foods Considering Carbon Emission Reduction

    The increasing demand for urban distribution increases the number of transportation vehicles, which intensifies urban traffic congestion and produces substantial carbon emissions. This paper focuses on carbon emission reduction in urban distribution, taking perishable foods as the object, and carries out optimization analysis of urban distribution routes to explore the impact of low-carbon policy on route planning. Based on an analysis of the cost components and corresponding constraints of urban distribution, two route optimization models, with and without carbon emission cost, are constructed; the fuel consumption that determines both cost and carbon emissions is calculated from traffic speed, vehicle fuel consumption characteristics, and the passable delivery time periods. An improved algorithm combining a genetic algorithm with tabu search is then designed to solve the models, and the influence of the carbon tax price is also analyzed. The results show that, in urban distribution based on actual network information, route optimization that considers the low-carbon factor can effectively reduce CO2 emissions in the distribution process and reduce the total cost to the enterprise and society, achieving greater social benefits at a lower cost. In addition, the government can encourage low-carbon distribution by rationally adjusting the carbon tax price to achieve higher social benefit.
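
    A minimal sketch of the two route-cost objectives (with and without carbon cost). All coefficients are hypothetical: the paper derives fuel use from traffic speed and passable time periods, which is reduced here to a speed-dependent per-kilometre rate, and the CO2 factor and prices are assumed values.

        def fuel_litres(dist_km, speed_kmh):
            # Assumed speed-dependent fuel rate: worst in congestion,
            # best near 60 km/h.
            rate = 0.12 + 0.08 * abs(speed_kmh - 60) / 60   # L/km
            return dist_km * rate

        def route_cost(legs, carbon_tax=0.0, fuel_price=1.5,
                       co2_kg_per_litre=2.6):
            """legs: list of (distance_km, expected_speed_kmh) tuples."""
            fuel = sum(fuel_litres(d, v) for d, v in legs)
            return fuel * fuel_price + fuel * co2_kg_per_litre * carbon_tax

        legs = [(4.0, 25.0), (7.5, 50.0), (3.2, 35.0)]
        print(route_cost(legs))                    # model without carbon cost
        print(route_cost(legs, carbon_tax=0.05))   # model with carbon tax

    In the paper's setting, an objective of this shape would be evaluated inside the hybrid genetic/tabu search to compare candidate route orderings.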

    The Simultaneous Interpolation of Target Radar Cross Section in Both the Spatial and Frequency Domains by Means of Legendre Wavelets Model-Based Parameter Estimation

    Understanding the target radar cross section (RCS) is important for target identification and for radar design and optimization. In this paper, a numerical algorithm for calculating target RCS is presented, based on Legendre wavelet model-based parameter estimation (LW-MBPE). The Padé rational function fitting model used for MBPE in the frequency domain is enhanced so that the numerator and denominator coefficients also depend on the spatial variable, allowing the function to interpolate target RCS in the frequency and spatial domains simultaneously. The use of Legendre wavelets guarantees convergence of the algorithm, and the method converges as the number of frequency and spatial sampling points increases. Numerical results are provided to demonstrate the validity and applicability of the new technique.
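
    To make the Padé step concrete, here is a frequency-only sketch using a linearized least-squares solve (polynomial orders and the solver choice are assumptions; the paper's formulation additionally expands the coefficients in Legendre wavelets over the spatial variable):

        import numpy as np

        def pade_fit(f, rcs, num_order=4, den_order=4):
            """Fit rcs(f) ~ N(f)/D(f) with D's constant term fixed to 1.

            Linearization: N(f_k) - rcs_k * sum_j b_j f_k^j = rcs_k,
            solved by least squares for numerator coeffs a and
            denominator coeffs b.
            """
            A_num = np.vander(f, num_order + 1, increasing=True)
            A_den = np.vander(f, den_order + 1, increasing=True)[:, 1:]
            A = np.hstack([A_num, -rcs[:, None] * A_den])
            x, *_ = np.linalg.lstsq(A, rcs, rcond=None)
            a = x[:num_order + 1]
            b = np.concatenate([[1.0], x[num_order + 1:]])
            return a, b

        def pade_eval(f, a, b):
            # np.polyval expects highest-degree coefficient first.
            return np.polyval(a[::-1], f) / np.polyval(b[::-1], f)

    Once fitted on sampled frequencies, pade_eval interpolates the RCS at any in-band frequency.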

    Social Distance and Information Avoidance in Public Security Events: A Dual Involvement Perspective

    With the rapid spread of information enabled by ICT, public security events increasingly draw public attention. Meanwhile, the phenomenon of information avoidance in these events persists and has even become more prominent. Existing studies on information avoidance, however, have ignored this important context (i.e., public security events) and the influence of people's perceptions of social relationships. To fill these gaps, we develop a model that explores the influence of social distance on information avoidance through two opposing mechanisms, perceived relevance and negative affect, from a dual involvement perspective in the context of public security events. We also consider the moderating role of self-efficacy to identify the boundary conditions. A scenario-based survey of college students was conducted to test the proposed research model. Theoretical contributions and practical implications are discussed.

    Intermittent Continuance of Smart Health Devices: A Zone-of-Tolerance Perspective

    Smart health and wearable devices have recently received widespread attention from practitioners and scholars. However, users' intermittent continuance behavior is considered one of the most important obstacles to the development of smart health. To address this issue, the current study employs zone-of-tolerance theory to explore the mechanisms through which intermittent continuance is evoked. In particular, the study develops two new constructs (i.e., performance superiority and performance adequacy) and proposes that they affect intermittent continuance via satisfaction and neutral satisfaction, respectively. Results demonstrate that the effects of the two new variables on intermittent continuance of smart health devices were fully mediated. The study concludes with theoretical and practical implications.

    Dynamically Mitigating Data Discrepancy with Balanced Focal Loss for Replay Attack Detection

    The advancement of high-quality playback devices makes it urgent to design effective anti-spoofing algorithms for vulnerable automatic speaker verification systems. Current studies mainly treat anti-spoofing as a binary classification problem between bona fide and spoofed utterances, but the scarcity of hard-to-distinguish samples makes it difficult to train a robust spoofing detector. In this paper, we argue that anti-spoofing requires more attention to indistinguishable samples than to easily classified ones during modeling, so that correct discrimination becomes the top priority. Therefore, to mitigate the data discrepancy between training and inference, we propose a balanced focal loss function as the training objective, which dynamically scales the loss based on the traits of each sample. In the experiments, we select three kinds of features containing both magnitude-based and phase-based information to form complementary, informative features. Experimental results on the ASVspoof2019 dataset demonstrate the superiority of the proposed methods in comparison with top-performing systems. Systems trained with the balanced focal loss perform significantly better than those trained with conventional cross-entropy loss. With complementary features, our fusion system with only three kinds of features outperforms other systems containing five or more complex single models by 22.5% for min-tDCF and 7% for EER, achieving a min-tDCF of 0.0124 and an EER of 0.55%. Furthermore, we present and discuss evaluation results on real replay data in addition to the simulated ASVspoof2019 data, indicating that anti-spoofing research still has a long way to go. (Accepted at the 25th International Conference on Pattern Recognition, ICPR 2020.)
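
    The abstract does not spell out the paper's exact balancing scheme; as a reference point, here is the standard alpha-balanced focal loss of Lin et al., whose (1 - p_t)^gamma factor down-weights easy examples so that hard, indistinguishable samples dominate the gradient:

        import torch
        import torch.nn.functional as F

        def balanced_focal_loss(logits, targets, alpha=0.25, gamma=2.0):
            """Alpha-balanced focal loss for binary spoof/bonafide scoring.

            logits: (N,) raw scores; targets: (N,) labels in {0, 1}.
            """
            ce = F.binary_cross_entropy_with_logits(
                logits, targets.float(), reduction="none")
            p = torch.sigmoid(logits)
            p_t = p * targets + (1 - p) * (1 - targets)   # prob of true class
            alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
            # Easy samples (p_t near 1) contribute almost nothing.
            return (alpha_t * (1 - p_t) ** gamma * ce).mean()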

    A Novel Adaptive Spectrum Noise Cancellation Approach for Enhancing Heartbeat Rate Monitoring in a Wearable Device

    This paper presents a novel approach, adaptive spectrum noise cancellation (ASNC), for removing motion artifacts from photoplethysmography (PPG) signals measured by an optical biosensor, yielding clean PPG waveforms for heartbeat rate calculation. One challenge of this optical sensing method is the noise inevitably induced by movement when the user is in motion, especially when the motion frequency is very close to the target heartbeat rate. The proposed ASNC uses the onboard accelerometer and gyroscope to detect and remove the artifacts adaptively, providing accurate heartbeat rate measurement while in motion. The algorithm performs frequency-domain analysis using the discrete cosine transform, a commonly accepted spectrum analysis approach in medical digital signal processing. Results obtained by ASNC were compared with two classic algorithms, adaptive threshold peak detection and adaptive noise cancellation. The mean (standard deviation) absolute error and mean relative error of the calculated heartbeat rate were 0.33 (0.57) beats·min-1 and 0.65% for ASNC, 2.29 (2.21) beats·min-1 and 8.38% for the adaptive threshold peak detection algorithm, and 1.70 (1.50) beats·min-1 and 2.02% for the adaptive noise cancellation algorithm. While all algorithms performed well on both simulated PPG data and clean PPG data collected from our Verity device in the absence of motion artifacts, ASNC provided better accuracy as motion artifacts increased, especially when the motion frequency was very close to the heartbeat rate.
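
    A minimal sketch of spectrum-domain artifact removal in the spirit of ASNC (the suppression rule, band limits, and all parameters below are assumptions; the paper's adaptive scheme is more elaborate): zero the DCT bins of the PPG that coincide with dominant accelerometer bins, then read the heart rate from the strongest remaining bin.

        import numpy as np
        from scipy.fft import dct

        def asnc_heart_rate(ppg, accel, fs, guard_bins=2):
            """Crude spectrum-noise-cancellation sketch; returns bpm.

            ppg, accel: 1-D arrays sampled at fs Hz.
            """
            P = dct(ppg - ppg.mean(), norm="ortho")
            A = np.abs(dct(accel - accel.mean(), norm="ortho"))
            # Zero PPG bins around the dominant motion bins.
            for k in np.argsort(A)[-3:]:
                P[max(k - guard_bins, 0):k + guard_bins + 1] = 0.0
            n = len(ppg)
            freqs = np.arange(n) / (2.0 * n) * fs   # DCT-II bin frequencies
            band = (freqs > 0.7) & (freqs < 3.5)    # plausible HR band
            k_hr = np.where(band)[0][np.argmax(np.abs(P[band]))]
            return freqs[k_hr] * 60.0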